

AGENTIQL: An Agent-Inspired Multi-Expert Framework for Text-to-SQL Generation

Heidari, Omid Reza, Reid, Siobhan, Yaakoubi, Yassine

arXiv.org Artificial Intelligence

LLMs have advanced text-to-SQL generation, yet monolithic architectures struggle with complex reasoning and schema diversity. We propose AGENTIQL, an agent-inspired multi-expert framework that combines a reasoning agent for question decomposition, a coding agent for sub-query generation, and a refinement step for column selection. An adaptive router further balances efficiency and accuracy by selecting between our modular pipeline and a baseline parser. Several steps in the pipeline can be executed in parallel, making the framework scalable to larger workloads. Evaluated on the Spider benchmark, AGENTIQL improves execution accuracy and interpretability and achieves up to 86.07% EX with 14B models using the Planner&Executor merging strategy. Performance depends on the effectiveness of the routing mechanism; with effective routing, the framework narrows the gap to GPT-4-based SOTA (89.65% EX) while using much smaller open-source LLMs. Beyond accuracy, AGENTIQL enhances transparency by exposing intermediate reasoning steps, offering a robust, scalable, and interpretable approach to semantic parsing.
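The routing idea described in the abstract can be illustrated with a minimal sketch. All function names, the decomposition heuristic, and the threshold below are illustrative assumptions, not the paper's actual implementation:

```python
# Hypothetical sketch of an adaptive router choosing between a baseline
# parser and a modular decompose -> generate -> refine pipeline, in the
# spirit of AGENTIQL. Every name and heuristic here is an assumption.

def baseline_parser(question: str) -> str:
    # Stand-in for a single-pass text-to-SQL parser.
    return f"SELECT * FROM t  -- baseline parse of: {question}"

def decompose(question: str) -> list:
    # Reasoning agent: split a question into sub-questions (toy heuristic).
    return [part.strip() for part in question.split(" and ")]

def generate_subquery(sub_question: str) -> str:
    # Coding agent: emit one sub-query per sub-question (stand-in).
    return f"SELECT ...  -- sub-query for: {sub_question}"

def refine(sql: str, columns: list) -> str:
    # Refinement step: restrict the query to a selected column list.
    return sql.replace("...", ", ".join(columns))

def route(question: str, threshold: int = 1) -> str:
    # Adaptive router: simple questions go to the baseline parser,
    # complex ones through the multi-expert pipeline.
    subs = decompose(question)
    if len(subs) <= threshold:
        return baseline_parser(question)
    parts = [refine(generate_subquery(s), ["name"]) for s in subs]
    return "\nUNION\n".join(parts)
```

In a real system the router would be learned or heuristic over schema and question features rather than a simple sub-question count.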


DeepSeek Has Rattled the AI Industry. Here's a Look at Other Chinese AI Models

TIME - Tech

The Chinese artificial intelligence firm DeepSeek has rattled markets with claims that its latest AI model, R1, performs on a par with those of OpenAI, despite using less advanced computer chips and consuming less energy. DeepSeek's emergence has raised concerns that China may have overtaken the U.S. in the artificial intelligence race despite restrictions on its access to the most advanced chips. Like the U.S., China is investing billions into artificial intelligence. Last week, it created a 60 billion yuan ($8.2 billion) AI investment fund, days after the U.S. imposed fresh chip export restrictions. Beijing has also invested heavily in the semiconductor industry to build its capacity to make advanced computer chips, working to overcome limits on its access to those of industry leaders.


4 Companies Leading The Rise Of Artificial Intelligence And Big Data

#artificialintelligence

The article highlights four companies in the Global X Artificial Intelligence & Technology ETF (AIQ) that are shaping the growth path of technologies along the AI value chain.


Integrating AI with SaaS-based Cloud Data Warehouses

#artificialintelligence

This article discusses what a SaaS-based cloud data warehouse integrated with AI is, as explained by Meng Shuo, MaxCompute product manager of the Alibaba Cloud business unit. Artificial intelligence (AI) is a concept that emerged as early as the 1950s. After that, due to various reasons, AI went through decades of dormancy. It was not until the last few years that AI became popular again. In fact, AI has enjoyed three "golden periods" of development in its history.


Saudi Arabia signs artificial intelligence agreements

#artificialintelligence

The agreements followed the announcement of Saudi Arabia's National Strategy for Data and Artificial Intelligence, launched during the Global AI Summit. Saudi Arabia has signed a series of partnership agreements with international tech companies to advance artificial intelligence (AI) in the kingdom. The agreements, which were signed at the virtual Global AI Summit held in Riyadh, are underpinned by Saudi Arabia's newly-launched National Strategy for Data and Artificial Intelligence (NSDAI). Saudi Arabia's National Center for Artificial Intelligence (NCAI) announced a memorandum of understanding (MoU) with China's Huawei to enable strategic cooperation on the kingdom's National AI Capability Development Program. Under the MoU, Huawei will support the NCAI in training Saudi AI engineers and students and in addressing Arabic-language AI capabilities. NCAI and Huawei will also explore the creation of an AI Capability Platform to localise technology solutions.


Climate Change & AI for GOOD

#artificialintelligence

Join Data Natives for a discussion on how to curb Climate Change and better protect our environment for the next generation. Get inspired by innovative solutions which use data, machine learning and AI technologies for GOOD. Lubomila Jordanova, Founder of Plan A, and featured speaker, explains that "the IT sector will use up to 51% of the global energy output in 2030. Let's adjust the digital industry and use Data for Climate Action, because carbon reduction is key to making companies future-proof." When used carefully, AI can help us solve some of the most serious challenges.


Flow Rate Control in Smart District Heating Systems Using Deep Reinforcement Learning

Zhang, Tinghao, Luo, Jing, Chen, Ping, Liu, Jie

arXiv.org Artificial Intelligence

At high latitudes, many cities adopt a centralized heating system to improve energy generation efficiency and reduce pollution. In multi-tier systems, known as district heating, few efficient approaches exist for flow rate control during the heating process. In this paper, we describe theoretical methods to solve this problem with deep reinforcement learning and propose a cloud-based heating control system for implementation. A real-world case study shows the effectiveness and practicability of the proposed system relative to human control, and simulated experiments show that deep reinforcement learning saves about 1985.01 gigajoules of heat quantity and 42276.45 tons of water per hour compared with manual control.
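The control problem sketched in the abstract can be framed as a reinforcement-learning loop. The toy Q-learning sketch below uses an invented state discretization, reward, and heating model; it only illustrates the general technique, not the paper's actual environment or algorithm:

```python
import random

# Toy Q-learning sketch for flow-rate control. The dynamics, reward,
# and discretization are illustrative assumptions.

ACTIONS = [-0.1, 0.0, 0.1]   # decrease / hold / increase the flow rate
TARGET_TEMP = 21.0           # assumed comfort set-point (deg C)

def step(temp, flow, action):
    """Apply a flow-rate change; return (new_temp, new_flow, reward)."""
    flow = min(1.0, max(0.0, flow + action))
    temp += 2.0 * flow - 1.0                        # crude thermal model
    reward = -abs(temp - TARGET_TEMP) - 0.1 * flow  # comfort minus pumping cost
    return temp, flow, reward

def train(episodes=200, seed=0):
    rng = random.Random(seed)
    q = {}  # Q-table keyed by (rounded temperature, action index)
    for _ in range(episodes):
        temp, flow = 18.0, 0.5
        for _ in range(20):
            s = round(temp)
            if rng.random() < 0.2:                  # epsilon-greedy exploration
                a = rng.randrange(3)
            else:
                a = max(range(3), key=lambda i: q.get((s, i), 0.0))
            temp, flow, r = step(temp, flow, ACTIONS[a])
            s2 = round(temp)
            best_next = max(q.get((s2, i), 0.0) for i in range(3))
            old = q.get((s, a), 0.0)
            q[(s, a)] = old + 0.1 * (r + 0.9 * best_next - old)
    return q
```

A deep RL approach as in the paper would replace the Q-table with a neural network over continuous sensor readings.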


Alibaba Cloud opens source code for machine-learning platform Alink · TechNode

#artificialintelligence

Alibaba Cloud, a subsidiary of Chinese e-commerce giant Alibaba, has opened to the public its source code for an in-house machine-learning platform that it used to drive product recommendations during this year's Singles Day shopping festival. Why it matters: Alibaba has sharpened its focus on open-source software since 2011. The company's cloud division is a member of the Linux Foundation and is active in a number of open-source communities including the Apache Software Foundation. Details: Dubbed Alink, the platform offers a range of algorithm libraries that allow for processing live data as well as batched datasets, Alibaba Cloud said in a statement on Thursday. "The difference between Alink and the pure AI platforms like Tensorflow and PyTorch is that those focus more on the algorithms and the design of the models. But today, for machine learning models to train effectively, we need to have a high-quality connection to the big data. Alink provides us with a seamless connection between the AI algorithms and the big data distributed systems."


PyTorch 1.3 adds mobile, privacy, quantization, and named tensors

#artificialintelligence

PyTorch continues to gain momentum because of its focus on meeting the needs of researchers, its streamlined workflow for production use, and most of all because of the enthusiastic support it has received from the AI community. PyTorch citations in papers on ArXiv grew 194 percent in the first half of 2019 alone, as noted by O'Reilly, and the number of contributors to the platform has grown more than 50 percent over the last year, to nearly 1,200. Facebook, Microsoft, Uber, and other organizations across industries are increasingly using it as the foundation for their most important machine learning (ML) research and production workloads. We are now advancing the platform further with the release of PyTorch 1.3, which includes experimental support for features such as seamless model deployment to mobile devices, model quantization for better performance at inference time, and front-end improvements, like the ability to name tensors and create clearer code with less need for inline comments. We're also launching a number of additional tools and libraries to support model interpretability and bringing multimodal research to production.
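The named-tensor feature mentioned above lets dimensions carry labels instead of relying on position, which is what reduces the need for inline shape comments. A minimal sketch, assuming PyTorch 1.3 or later, where named tensors are experimental:

```python
import torch  # named tensors landed as experimental in PyTorch 1.3

# Label the batch, channel, height, and width dimensions by name.
imgs = torch.zeros(4, 3, 32, 32, names=("N", "C", "H", "W"))
print(imgs.names)

# align_to reorders dimensions by name rather than by index.
channels_last = imgs.align_to("N", "H", "W", "C")
print(channels_last.shape)
```

Because operations check names, mismatched dimensions raise errors at the call site instead of silently broadcasting.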


Part 2: Image Classification using Features Extracted by Transfer Learning in Keras

#artificialintelligence

Part 1 discussed the traditional machine learning (ML) pipeline and highlighted that manual feature extraction is not the right choice for working with large datasets. Deep learning (DL), on the other hand, is able to automatically extract features from such large datasets. Part 1 also introduced transfer learning, which makes it possible to use DL for small datasets by transferring the learning of a pre-trained model. In this tutorial, which is Part 2 of the series, we start the practical side of the project by creating a Jupyter notebook and making sure everything is up and running. After that, the Fruits360 dataset is downloaded using Keras within the Jupyter notebook. Once the dataset has downloaded successfully, its training and test images are read into NumPy arrays, which will later be fed to MobileNet for feature extraction. This series uses the Jupyter notebook for transfer learning with the pre-trained MobileNet.
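The feature-extraction step described above can be sketched with tf.keras. This is a minimal sketch, assuming TensorFlow 2.x; random arrays stand in for the Fruits360 images, and `weights=None` avoids downloading weights (the tutorial would use `weights="imagenet"` for actual pretrained features):

```python
import numpy as np
import tensorflow as tf

# Headless MobileNet as a feature extractor: no classification head,
# global average pooling over the final feature map.
extractor = tf.keras.applications.MobileNet(
    weights=None,            # tutorial setting would be weights="imagenet"
    include_top=False,
    pooling="avg",
    input_shape=(128, 128, 3),
)

# Stand-in for Fruits360 images already read into a NumPy array.
images = np.random.rand(2, 128, 128, 3).astype("float32") * 255.0
images = tf.keras.applications.mobilenet.preprocess_input(images)

# One fixed-length feature vector per image, ready for a small classifier.
features = extractor.predict(images, verbose=0)
print(features.shape)
```

The extracted vectors can then be fed to a lightweight classifier (e.g. a dense layer), which is the usual next step in a transfer-learning pipeline.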